AI can now code in any language without additional training.

In 2017, researchers asked: will AI write most of its own code by 2040? Testers are now using OpenAI's GPT-3, which can code in any language. Machine-dominated coding is almost at hand.

GPT-3 was trained on hundreds of billions of words, essentially the entire public Internet, which is why it can write CSS, JSX, Python, or any other language you can name.

Moreover, GPT-3 does not need to be fine-tuned for individual language tasks, because its training data is all-encompassing. Instead, when you give it a few simple instructions, the network conditions itself on the task at hand.

 

Evolution of GPT-n

GPT achieved state-of-the-art results on language tasks by combining supervised learning with unsupervised pre-training, using the parameters from the unsupervised step as the starting point for the supervised step. Compared with its successors, GPT is tiny: it was trained on thousands of books using just an 8-GPU machine.
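To make that recipe concrete, here is a toy sketch of the two-stage idea in PyTorch; the model, sizes, and data are illustrative stand-ins, not OpenAI's actual code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# A toy sketch of the GPT-1 recipe (not OpenAI's code):
# (1) pre-train a language model on unlabeled text, then
# (2) reuse its weights as the starting point for a supervised task.
class TinyLM(nn.Module):
    def __init__(self, vocab_size=1000, d_model=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.backbone = nn.TransformerEncoder(layer, num_layers=2)
        self.lm_head = nn.Linear(d_model, vocab_size)

    def forward(self, tokens):  # causal mask omitted for brevity
        return self.lm_head(self.backbone(self.embed(tokens)))

lm = TinyLM()
tokens = torch.randint(0, 1000, (8, 32))  # stand-in for tokenized book text

# Stage 1: unsupervised pre-training (predict each next token).
logits = lm(tokens[:, :-1])
loss = F.cross_entropy(logits.reshape(-1, 1000), tokens[:, 1:].reshape(-1))
loss.backward()  # one illustrative gradient step

# Stage 2: supervised fine-tuning, initialized from the pre-trained weights.
classifier = nn.Linear(64, 2)  # e.g. a two-class sentiment head
features = lm.backbone(lm.embed(tokens)).mean(dim=1)
task_logits = classifier(features)  # train this on labeled examples
```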

GPT-2 scaled things up considerably, with 10 times the parameters and more than 10 times the training data. Still, the dataset was relatively curated: it was trained on outbound Reddit links with at least 3 karma. GPT-2 was described as a "chameleon-like" synthetic-text generator, but it was not state-of-the-art on downstream tasks such as question answering, summarization, or translation.

GPT-3 is the latest and most powerful model in the family, achieving state-of-the-art results on a range of tasks. Its main breakthrough is eliminating the need for task-specific fine-tuning. In size, the model has scaled up dramatically yet again, reaching 175 billion parameters, 116 times the scale of its predecessor.

In fact, GPT-3 can perform many tasks with no task-specific examples at all (zero-shot learning), while a single example (one-shot) or a handful of examples (few-shot) push its already strong performance even higher.
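The distinction is easiest to see in the prompts themselves. Below is a minimal sketch of the three regimes, using the English-to-French demo popularized by the GPT-3 paper; the exact strings are illustrative.

```python
# The three prompting regimes. No gradient updates happen in any case;
# the "examples" live only in the prompt text.
zero_shot = "Translate English to French:\ncheese =>"

one_shot = (
    "Translate English to French:\n"
    "sea otter => loutre de mer\n"
    "cheese =>"
)

few_shot = (
    "Translate English to French:\n"
    "sea otter => loutre de mer\n"
    "peppermint => menthe poivrée\n"
    "plush giraffe => girafe en peluche\n"
    "cheese =>"
)
```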

Evolve or die

Here is the situation: testers are using GPT-3 to generate working code that handles the necessary boilerplate, from buttons to data tables, and even a recreation of the Google homepage. All of these examples were produced through zero-shot learning.
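As a rough sketch of what such a test looks like in practice, here is a zero-shot request through the OpenAI Python client as it worked during GPT-3's 2020 beta; the description-to-code prompt format is an assumption for illustration.

```python
import openai  # the OpenAI Python client, circa GPT-3's 2020 beta

openai.api_key = "YOUR_API_KEY"

# Zero-shot: describe the UI element and let the model write the code.
# The prompt format here is an illustrative assumption.
prompt = "description: a red button that says 'Subscribe'\ncode:"

response = openai.Completion.create(
    engine="davinci",   # the original GPT-3 engine name
    prompt=prompt,
    max_tokens=64,
    temperature=0,
    stop="\n\n",
)
print(response["choices"][0]["text"])  # e.g. a JSX snippet for the button
```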

Beyond the rapid progress of AI itself, two other major technology trends make the future of programming work even more precarious: no-code and AutoML.

No-code refers to visual tools that let anyone easily build new products, whether websites, designs, data analyses, or models. WordPress, Wix, and Shopify are examples of no-code tools that enable millions of people to do things themselves without hiring developers or designers.

The second trend is AutoML, or automated machine learning, which dramatically shortens the time needed to put AI into production.

Tools such as Apteo combine these trends, letting anyone deploy AI models without any coding skills.

GPT-3 will trigger another wave of no-code and AutoML tools, and many potential employers will choose those tools over hiring expensive programmers.

Naturally, the least-skilled programmers will be the first to go, while experts will enjoy greater job security, just as in any field.

 

Not just code - GPT-3 applies to any language task

To direct GPT-3 toward a specific language task, you simply show it examples of the output you want. You can therefore prompt GPT-3 to write code, but also poetry, music, social media posts, or any other text.
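For instance, swapping the examples in the prompt retargets the model with no retraining; this sentiment-classification sketch is illustrative, not from the original demos.

```python
# Steering GPT-3 with in-prompt examples: only the prompt changes,
# never the model. (Example tweets are illustrative.)
prompt = (
    "Tweet: I love the new design!\nSentiment: positive\n"
    "Tweet: This update broke everything.\nSentiment: negative\n"
    "Tweet: The rollout went smoother than expected.\nSentiment:"
)
# Passing this prompt to the same Completion.create call shown earlier
# turns GPT-3 into a sentiment classifier, with no fine-tuning.
```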

For example, to generate a news article, you can enter a title such as "United Methodists Agree to Historic Split" and a subtitle such as "Those who oppose gay marriage will form their own denomination".

Entering these, GPT-3 produces the following results:

"After two days of fierce debate, the Methodist Federation has reached a historic split - expected to end with the creation of a new sect. According to the Washington Post, this sect will be" theological and social conservative ". The majority of delegates attending the annual church meeting in May voted to strengthen the ban on the ranking of LGBTQ clergy, and formulated new rules to "discipline" the clergy presiding over same-sex weddings. But those who oppose these measures have a new plan: they say that they will form a separate sect by 2020, calling their church Methodists... "

Only 12% of human readers correctly identified this as written by AI; the other 88% were fooled.

Like humans, GPT-3 can learn new words with just one example. For example, given the context:

"Burringo" is a car with very fast acceleration. An example of a sentence using the word Burringo is:

GPT-3 output:

In our garage we have a Burringo that my father drives to work every day.

These results are exciting. Keep in mind that AI progress is relentless, so any criticism of its current performance will quickly become outdated.

 

More than Language - GPT Applied to Images

GPT can write code and prose, but it can also generate images.

How could that be?

The same model architecture can be trained on sequences of pixels instead of encoded text, generating new images instead of new text. In fact, it is good enough at this to compete with top CNNs.
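A minimal sketch of the idea, heavily simplified from OpenAI's Image GPT:

```python
import torch

# Treat pixels as tokens: flatten the image into a 1-D sequence and
# model it autoregressively, exactly like a language model models text.
image = torch.randint(0, 256, (32, 32))   # a fake 32x32 grayscale image
pixels = image.flatten()                   # a sequence of 1024 "pixel tokens"
inputs, targets = pixels[:-1], pixels[1:]  # next-"token" prediction pairs
# Any autoregressive model (e.g. the TinyLM sketch earlier, with
# vocab_size=256) can now be trained to predict each pixel from the ones
# before it; sampling from it then generates new images pixel by pixel.
```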

I mention this because it shows that GPT (and its successors) not only has the potential to replace coders one day but, thanks to its generality, could replace entire industries.

Conclusion

GPT-3's incredible performance has led many to believe that superintelligence is closer than we thought, or at least that AI-generated code is. It produces creative, insightful, profound, and even beautiful content.